Advances in robotic technology for hand rehabilitation, particularly soft robotic gloves, have significant potential to improve patient outcomes. While vision-based algorithms pave the way for fast and convenient hand pose estimation, most current models struggle to track hand movements accurately when soft robotic gloves are worn, primarily because of severe occlusion. This limitation reduces the applicability of soft robotic gloves in digital and remote rehabilitation assessment. Furthermore, traditional clinical assessments such as the Fugl-Meyer Assessment (FMA) rely on manual measurements and subjective scoring scales, lacking the efficiency and quantitative accuracy needed to monitor hand function recovery in data-driven personalised rehabilitation. Consequently, few integrated evaluation systems provide reliable quantitative assessments. In this work, we propose an RGB-based evaluation system for soft robotic glove applications, aimed at bridging these gaps in hand function assessment. By incorporating the Hand Mesh Reconstruction (HaMeR) model fine-tuned with motion capture data, our hand estimation framework overcomes occlusion and enables accurate, continuous tracking of hand movements with reduced errors. The resulting functional metrics include conventional clinical benchmarks such as the mean per joint angle error (MPJAE) and range of motion (ROM), providing quantitative, consistent measures of rehabilitation progress and achieving tracking errors below 10°. In addition, we introduce adapted benchmarks, namely the angle percentage of correct keypoints (APCK), the mean per joint angular velocity error (MPJAVE) and the angular spectral arc length (SPARC) error, to characterise movement stability and smoothness. This extensible and adaptable solution demonstrates the potential of vision-based systems for future clinical and home-based rehabilitation assessment.
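The angle-based metrics named above can be sketched roughly as follows. This is an illustrative outline only: the function names, the 10° APCK cutoff, the frame rate, and the simplified spectral arc length computation are assumptions for demonstration, not the paper's exact formulations.

```python
import numpy as np

def mpjae(pred, gt):
    """Mean per joint angle error (degrees): average absolute difference
    between predicted and ground-truth joint-angle trajectories."""
    return np.mean(np.abs(pred - gt))

def apck(pred, gt, threshold=10.0):
    """Angle percentage of correct keypoints: fraction of joint-angle
    predictions within `threshold` degrees of the ground truth.
    (The threshold value here is an assumed example.)"""
    return np.mean(np.abs(pred - gt) <= threshold)

def mpjave(pred, gt, fps=30.0):
    """Mean per joint angular velocity error (deg/s), from frame-to-frame
    differences of the angle trajectories (axis 0 = time)."""
    v_pred = np.diff(pred, axis=0) * fps
    v_gt = np.diff(gt, axis=0) * fps
    return np.mean(np.abs(v_pred - v_gt))

def sparc(speed, fs=30.0, pad=4, fc=10.0, amp_th=0.05):
    """Simplified spectral arc length of a 1-D angular-speed profile
    (more negative = less smooth). Zero-pads, takes the normalised FFT
    magnitude up to cutoff fc, and measures the arc length of that curve."""
    n = int(2 ** np.ceil(np.log2(len(speed)) + pad))
    f = np.arange(n) * fs / n
    mag = np.abs(np.fft.fft(speed, n))
    mag = mag / mag.max()
    sel = f <= fc
    f_sel, mag_sel = f[sel], mag[sel]
    # keep the spectrum up to the last component above the amplitude threshold
    idx = np.nonzero(mag_sel >= amp_th)[0]
    f_sel, mag_sel = f_sel[: idx[-1] + 1], mag_sel[: idx[-1] + 1]
    df = np.diff(f_sel) / (f_sel[-1] - f_sel[0])
    dm = np.diff(mag_sel)
    return -np.sum(np.sqrt(df ** 2 + dm ** 2))
```

In this sketch, `pred` and `gt` are (frames x joints) arrays of joint angles in degrees; a SPARC error such as the one mentioned above would then compare the smoothness score of the estimated trajectory against that of the reference.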